Conversation


@sentrivana sentrivana commented Sep 8, 2025

  • Move httpx under toxgen
    • This is not straightforward, since we depend on pytest-httpx, which is unusually strict about which httpx versions it supports
    • In addition, some pytest-httpx versions will not install on Python 3.8 even though the corresponding httpx versions are compatible -- added a new way to filter out these scenarios
  • This also removes the -latest Networking group, as the httpx test suite was the last Networking test suite that wasn't completely pinned
  • Also move gcp to the permanent part of the ignore list, as it doesn't fit the toxgen mold

Ref #4506
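The new filtering mechanism isn't shown in this conversation; a minimal sketch of what install-compatibility filtering could look like (all names, versions, and data below are hypothetical, not the actual toxgen implementation):

```python
# Hypothetical sketch: skip test scenarios whose pinned dependency is known
# not to install on a given Python version, even though the library under
# test itself is nominally compatible with that Python.
SKIP_INSTALL = {
    # (package, python version) -> package versions known not to install
    ("pytest-httpx", (3, 8)): {"0.23.0"},
}

def is_installable(package, version, python_version):
    """Return False if this (package, version) combo is known not to
    install on the given Python version."""
    blocked = SKIP_INSTALL.get((package, python_version), set())
    return version not in blocked

print(is_installable("pytest-httpx", "0.23.0", (3, 8)))  # False
print(is_installable("pytest-httpx", "0.22.0", (3, 8)))  # True
```

Scenarios failing this check would simply never be written into the generated tox config.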

codecov bot commented Sep 8, 2025

❌ 135 Tests Failed:

| Tests completed | Failed | Passed | Skipped |
| --------------- | ------ | ------ | ------- |
| 23806           | 135    | 23671  | 2309    |
View the top 3 failed test(s) by shortest run time
::tests.integrations.asyncpg.test_asyncpg
Stack Traces | 0s run time
.tox/py3.6-asyncpg-v0.25.0/lib/python3.6.../site-packages/_pytest/python.py:599: in _importtestmodule
    mod = import_path(self.path, mode=importmode, root=self.config.rootpath)
.tox/py3.6-asyncpg-v0.25.0/lib/python3.6.../site-packages/_pytest/pathlib.py:533: in import_path
    importlib.import_module(module_name)
.../local/lib/python3.6/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:994: in _gcd_import
    ???
<frozen importlib._bootstrap>:971: in _find_and_load
    ???
<frozen importlib._bootstrap>:955: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:665: in _load_unlocked
    ???
.tox/py3.6-asyncpg-v0.25.0/lib/python3.6.../_pytest/assertion/rewrite.py:171: in exec_module
    exec(co, module.__dict__)
.../integrations/asyncpg/test_asyncpg.py:23: in <module>
    from sentry_sdk.integrations.asyncpg import AsyncPGIntegration
E     File ".../sentry_sdk/integrations/asyncpg.py", line 1
E       from __future__ import annotations
E                                        ^
E   SyntaxError: future feature annotations is not defined
::tests.integrations.grpc.test_grpc
Stack Traces | 0s run time
ImportError while importing test module '.../integrations/grpc/test_grpc.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
.tox/py3.6-grpc-v1.46.5/lib/python3.6.../site-packages/_pytest/python.py:599: in _importtestmodule
    mod = import_path(self.path, mode=importmode, root=self.config.rootpath)
.tox/py3.6-grpc-v1.46.5/lib/python3.6.../site-packages/_pytest/pathlib.py:533: in import_path
    importlib.import_module(module_name)
.../local/lib/python3.6/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:994: in _gcd_import
    ???
<frozen importlib._bootstrap>:971: in _find_and_load
    ???
<frozen importlib._bootstrap>:955: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:665: in _load_unlocked
    ???
.tox/py3.6-grpc-v1.46.5/lib/python3.6.../_pytest/assertion/rewrite.py:171: in exec_module
    exec(co, module.__dict__)
.../integrations/grpc/test_grpc.py:12: in <module>
    from tests.integrations.grpc.grpc_test_service_pb2 import gRPCTestMessage
.../integrations/grpc/grpc_test_service_pb2.py:8: in <module>
    from google.protobuf.internal import builder as _builder
E   ImportError: cannot import name 'builder'
::tests.integrations.grpc.test_grpc_aio
Stack Traces | 0s run time
ImportError while importing test module '.../integrations/grpc/test_grpc_aio.py'.
Hint: make sure your test modules/packages have valid Python names.
Traceback:
.tox/py3.6-grpc-v1.46.5/lib/python3.6.../site-packages/_pytest/python.py:599: in _importtestmodule
    mod = import_path(self.path, mode=importmode, root=self.config.rootpath)
.tox/py3.6-grpc-v1.46.5/lib/python3.6.../site-packages/_pytest/pathlib.py:533: in import_path
    importlib.import_module(module_name)
.../local/lib/python3.6/importlib/__init__.py:126: in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
<frozen importlib._bootstrap>:994: in _gcd_import
    ???
<frozen importlib._bootstrap>:971: in _find_and_load
    ???
<frozen importlib._bootstrap>:955: in _find_and_load_unlocked
    ???
<frozen importlib._bootstrap>:665: in _load_unlocked
    ???
.tox/py3.6-grpc-v1.46.5/lib/python3.6.../_pytest/assertion/rewrite.py:171: in exec_module
    exec(co, module.__dict__)
.../integrations/grpc/test_grpc_aio.py:12: in <module>
    from tests.integrations.grpc.grpc_test_service_pb2 import gRPCTestMessage
.../integrations/grpc/grpc_test_service_pb2.py:8: in <module>
    from google.protobuf.internal import builder as _builder
E   ImportError: cannot import name 'builder'
tests.integrations.spark.test_spark::test_sentry_listener_on_job_end[JobFailed-warning]
Stack Traces | 0.001s run time
.../integrations/spark/test_spark.py:135: in test_sentry_listener_on_job_end
    assert mock_hub.kwargs["level"] == level
E   TypeError: tuple indices must be integers or slices, not str
tests.integrations.spark.test_spark::test_sentry_listener_on_job_end[JobSucceeded-info]
Stack Traces | 0.001s run time
.../integrations/spark/test_spark.py:135: in test_sentry_listener_on_job_end
    assert mock_hub.kwargs["level"] == level
E   TypeError: tuple indices must be integers or slices, not str
tests.integrations.spark.test_spark::test_sentry_listener_on_stage_completed_failure
Stack Traces | 0.001s run time
.../integrations/spark/test_spark.py:295: in test_sentry_listener_on_stage_completed_failure
    assert mock_hub.kwargs["level"] == "warning"
E   TypeError: tuple indices must be integers or slices, not str
tests.integrations.spark.test_spark::test_sentry_listener_on_stage_completed_success
Stack Traces | 0.001s run time
.../integrations/spark/test_spark.py:277: in test_sentry_listener_on_stage_completed_success
    assert mock_hub.kwargs["level"] == "info"
E   TypeError: tuple indices must be integers or slices, not str
tests.integrations.spark.test_spark::test_sentry_listener_on_stage_submitted
Stack Traces | 0.001s run time
.../integrations/spark/test_spark.py:165: in test_sentry_listener_on_stage_submitted
    assert mock_hub.kwargs["level"] == "info"
E   TypeError: tuple indices must be integers or slices, not str
tests.integrations.spark.test_spark::test_sentry_listener_on_stage_submitted_no_attempt_id
Stack Traces | 0.001s run time
.../integrations/spark/test_spark.py:196: in test_sentry_listener_on_stage_submitted_no_attempt_id
    assert mock_hub.kwargs["level"] == "info"
E   TypeError: tuple indices must be integers or slices, not str
tests.integrations.spark.test_spark::test_sentry_listener_on_stage_submitted_no_attempt_id_or_number
Stack Traces | 0.001s run time
.../integrations/spark/test_spark.py:224: in test_sentry_listener_on_stage_submitted_no_attempt_id_or_number
    assert mock_hub.kwargs["level"] == "info"
E   TypeError: tuple indices must be integers or slices, not str
tests.integrations.spark.test_spark::test_sentry_listener_on_job_start
Stack Traces | 0.002s run time
.../integrations/spark/test_spark.py:106: in test_sentry_listener_on_job_start
    assert mock_hub.kwargs["level"] == "info"
E   TypeError: tuple indices must be integers or slices, not str
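The spark `TypeError`s look like a Python 3.6 artifact: the `.kwargs` property on mock call objects was only added in Python 3.8's `unittest.mock`. On older versions the attribute lookup falls through to `_Call.__getattr__` and yields another tuple-like `_Call`, which cannot be indexed with a string. Assuming `mock_hub` here is such a call object, unpacking the `(args, kwargs)` tuple is portable across versions (a sketch of the failure mode, not the actual test fix):

```python
from unittest.mock import Mock

m = Mock()
m(1, 2, level="info")

# Portable on every mock version: call_args is a (args, kwargs) tuple.
args, kwargs = m.call_args
assert kwargs["level"] == "info"

# Python 3.8+ only: m.call_args.kwargs is a real property. On 3.6/3.7
# that attribute access returns another _Call (a tuple subclass), and
# indexing it with "level" raises the TypeError seen above.
print(kwargs["level"])
```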
tests.integrations.anthropic.test_anthropic::test_exception_message_create_async
Stack Traces | 0.054s run time
.../integrations/anthropic/test_anthropic.py:707: in test_exception_message_create_async
    sentry_init(integrations=[AnthropicIntegration()], traces_sample_rate=1.0)
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
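This `AttributeError` (repeated across the anthropic tests below) is the HuggingFace Hub integration unconditionally patching `InferenceClient.chat_completion`, which doesn't exist on the older `huggingface_hub` resolved in these envs. A defensive-patching sketch, using a hypothetical helper rather than the SDK's actual `setup_once` code:

```python
def safe_patch(cls, attr, make_wrapper):
    """Patch cls.attr only if it exists; otherwise report why we bailed."""
    original = getattr(cls, attr, None)
    if original is None:
        raise RuntimeError(
            "%s.%s not available in this library version" % (cls.__name__, attr)
        )
    setattr(cls, attr, make_wrapper(original))

class InferenceClient:  # stand-in for huggingface_hub's client
    pass

try:
    safe_patch(InferenceClient, "chat_completion", lambda f: f)
except RuntimeError as e:
    print(e)  # InferenceClient.chat_completion not available in this library version
```

The SDK's integrations normally raise `DidNotEnable` in this situation instead of letting the `AttributeError` escape.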
tests.integrations.anthropic.test_anthropic::test_exception_message_create
Stack Traces | 0.055s run time
.../integrations/anthropic/test_anthropic.py:687: in test_exception_message_create
    sentry_init(integrations=[AnthropicIntegration()], traces_sample_rate=1.0)
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message[True-False]
Stack Traces | 0.055s run time
.../integrations/anthropic/test_anthropic.py:85: in test_nonstreaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message_async[False-False]
Stack Traces | 0.055s run time
.../integrations/anthropic/test_anthropic.py:154: in test_nonstreaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message[False-False]
Stack Traces | 0.056s run time
.../integrations/anthropic/test_anthropic.py:85: in test_nonstreaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message[False-True]
Stack Traces | 0.056s run time
.../integrations/anthropic/test_anthropic.py:85: in test_nonstreaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message_async[True-False]
Stack Traces | 0.056s run time
.../integrations/anthropic/test_anthropic.py:154: in test_nonstreaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_span_origin_async
Stack Traces | 0.056s run time
.../integrations/anthropic/test_anthropic.py:755: in test_span_origin_async
    traces_sample_rate=1.0,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message_async[False-True]
Stack Traces | 0.057s run time
.../integrations/anthropic/test_anthropic.py:154: in test_nonstreaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message_async[True-True]
Stack Traces | 0.057s run time
.../integrations/anthropic/test_anthropic.py:154: in test_nonstreaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_span_origin
Stack Traces | 0.057s run time
.../integrations/anthropic/test_anthropic.py:728: in test_span_origin
    traces_sample_rate=1.0,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.aiohttp.test_aiohttp::test_outgoing_trace_headers_append_to_baggage[pyloop]
Stack Traces | 0.075s run time
.../integrations/aiohttp/test_aiohttp.py:613: in test_outgoing_trace_headers_append_to_baggage
    release="d08ebdb9309e1b004c6f52202de58a09c2268e42",
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
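The aiohttp failures are the integration refusing to enable on a py3.6 env with no working `contextvars`. What `contextvars` buys is per-task isolation: each asyncio task runs in its own copy of the context, so a value set in one request handler cannot leak into a concurrently running one. A minimal demonstration:

```python
import asyncio
import contextvars

request_id = contextvars.ContextVar("request_id", default=None)

async def handler(rid):
    # Each task runs in its own copy of the context, so this set()
    # is invisible to the other concurrently running handler.
    request_id.set(rid)
    await asyncio.sleep(0)  # yield to the other handler
    return request_id.get()

async def main():
    return await asyncio.gather(handler("a"), handler("b"))

print(asyncio.run(main()))  # ['a', 'b'] -- no cross-request leakage
```

Without this isolation (e.g. plain thread-locals under asyncio), the SDK's scope data could bleed between requests, which is why the integration bails out entirely.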
tests.integrations.anthropic.test_anthropic::test_streaming_create_message[True-True]
Stack Traces | 0.083s run time
.../integrations/anthropic/test_anthropic.py:257: in test_streaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_streaming_create_message_async[False-True]
Stack Traces | 0.083s run time
.../integrations/anthropic/test_anthropic.py:361: in test_streaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs5-HTTPInternalServerError-False]
Stack Traces | 0.084s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs8-HTTPInternalServerError-True]
Stack Traces | 0.084s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.anthropic.test_anthropic::test_streaming_create_message[False-True]
Stack Traces | 0.084s run time
.../integrations/anthropic/test_anthropic.py:257: in test_streaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_streaming_create_message_async[False-False]
Stack Traces | 0.084s run time
.../integrations/anthropic/test_anthropic.py:361: in test_streaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_streaming_create_message_async[True-False]
Stack Traces | 0.084s run time
.../integrations/anthropic/test_anthropic.py:361: in test_streaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.anthropic.test_anthropic::test_streaming_create_message_async[True-True]
Stack Traces | 0.084s run time
.../integrations/anthropic/test_anthropic.py:361: in test_streaming_create_message_async
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs1-HTTPBadRequest-False]
Stack Traces | 0.085s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs4-HTTPNetworkAuthenticationRequired-True]
Stack Traces | 0.085s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs6-HTTPNetworkAuthenticationRequired-False]
Stack Traces | 0.085s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_span_origin[pyloop]
Stack Traces | 0.085s run time
.../integrations/aiohttp/test_aiohttp.py:645: in test_span_origin
    traces_sample_rate=1.0,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.anthropic.test_anthropic::test_streaming_create_message[False-False]
Stack Traces | 0.085s run time
.../integrations/anthropic/test_anthropic.py:257: in test_streaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs0-None-False]
Stack Traces | 0.086s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs3-HTTPInternalServerError-True]
Stack Traces | 0.086s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs9-HTTPBadRequest-False]
Stack Traces | 0.086s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes_non_http_exception[pyloop]
Stack Traces | 0.086s run time
.../integrations/aiohttp/test_aiohttp.py:770: in test_failed_request_status_codes_non_http_exception
    sentry_init(integrations=[AioHttpIntegration(failed_request_status_codes=set())])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_outgoing_trace_headers[pyloop]
Stack Traces | 0.086s run time
.../integrations/aiohttp/test_aiohttp.py:579: in test_outgoing_trace_headers
    traces_sample_rate=1.0,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_tracing[pyloop]
Stack Traces | 0.086s run time
.../integrations/aiohttp/test_aiohttp.py:191: in test_tracing
    sentry_init(integrations=[AioHttpIntegration()], traces_sample_rate=1.0)
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.anthropic.test_anthropic::test_streaming_create_message[True-False]
Stack Traces | 0.086s run time
.../integrations/anthropic/test_anthropic.py:257: in test_streaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.aiohttp.test_aiohttp::test_403_not_captured[pyloop]
Stack Traces | 0.087s run time
.../integrations/aiohttp/test_aiohttp.py:129: in test_403_not_captured
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_crumb_capture_client_error[pyloop-500-error]
Stack Traces | 0.087s run time
.../integrations/aiohttp/test_aiohttp.py:540: in test_crumb_capture_client_error
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs7-HTTPNotFound-True]
Stack Traces | 0.087s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_post_body_not_read[pyloop]
Stack Traces | 0.087s run time
.../integrations/aiohttp/test_aiohttp.py:72: in test_post_body_not_read
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_post_body_read[pyloop]
Stack Traces | 0.087s run time
.../integrations/aiohttp/test_aiohttp.py:100: in test_post_body_read
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_trace_from_headers_if_performance_enabled[pyloop]
Stack Traces | 0.087s run time
.../integrations/aiohttp/test_aiohttp.py:402: in test_trace_from_headers_if_performance_enabled
    sentry_init(integrations=[AioHttpIntegration()], traces_sample_rate=1.0)
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_crumb_capture_client_error[pyloop-200-None]
Stack Traces | 0.088s run time
.../integrations/aiohttp/test_aiohttp.py:540: in test_crumb_capture_client_error
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_crumb_capture_client_error[pyloop-403-warning]
Stack Traces | 0.088s run time
.../integrations/aiohttp/test_aiohttp.py:540: in test_crumb_capture_client_error
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes[pyloop-integration_kwargs2-exception_to_raise2-False]
Stack Traces | 0.088s run time
.../integrations/aiohttp/test_aiohttp.py:712: in test_failed_request_status_codes
    sentry_init(integrations=[AioHttpIntegration(**integration_kwargs)])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_half_initialized[pyloop]
Stack Traces | 0.088s run time
.../integrations/aiohttp/test_aiohttp.py:171: in test_half_initialized
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_cancelled_error_not_captured[pyloop]
Stack Traces | 0.089s run time
.../integrations/aiohttp/test_aiohttp.py:150: in test_cancelled_error_not_captured
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_crumb_capture_client_error[pyloop-301-None]
Stack Traces | 0.089s run time
.../integrations/aiohttp/test_aiohttp.py:540: in test_crumb_capture_client_error
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_tracing_unparseable_url[pyloop]
Stack Traces | 0.089s run time
.../integrations/aiohttp/test_aiohttp.py:268: in test_tracing_unparseable_url
    sentry_init(integrations=[AioHttpIntegration()], traces_sample_rate=1.0)
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_crumb_capture_client_error[pyloop-405-warning]
Stack Traces | 0.09s run time
.../integrations/aiohttp/test_aiohttp.py:540: in test_crumb_capture_client_error
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_failed_request_status_codes_with_returned_status[pyloop]
Stack Traces | 0.091s run time
.../integrations/aiohttp/test_aiohttp.py:746: in test_failed_request_status_codes_with_returned_status
    sentry_init(integrations=[AioHttpIntegration(failed_request_status_codes={500})])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_has_trace_if_performance_enabled[pyloop]
Stack Traces | 0.091s run time
.../integrations/aiohttp/test_aiohttp.py:332: in test_has_trace_if_performance_enabled
    sentry_init(integrations=[AioHttpIntegration()], traces_sample_rate=1.0)
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_has_trace_if_performance_disabled[pyloop]
Stack Traces | 0.092s run time
.../integrations/aiohttp/test_aiohttp.py:369: in test_has_trace_if_performance_disabled
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_transaction_style[pyloop-/message-method_and_path_pattern-GET /{var}-route]
Stack Traces | 0.092s run time
.../integrations/aiohttp/test_aiohttp.py:243: in test_transaction_style
    traces_sample_rate=1.0,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_crumb_capture[pyloop]
Stack Traces | 0.094s run time
.../integrations/aiohttp/test_aiohttp.py:487: in test_crumb_capture
    integrations=[AioHttpIntegration()], before_breadcrumb=before_breadcrumb
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.aiohttp.test_aiohttp::test_transaction_style[pyloop-/message-handler_name-tests.integrations.aiohttp.test_aiohttp.test_transaction_style.<locals>.hello-component]
Stack Traces | 0.094s run time
.../integrations/aiohttp/test_aiohttp.py:243: in test_transaction_style
    traces_sample_rate=1.0,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.starlite.test_starlite::test_middleware_receive_send
Stack Traces | 0.094s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:235: in test_middleware_receive_send
    starlite_app = starlite_app_factory(middleware=[SampleReceiveSendMiddleware])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
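All of the starlite failures share this root cause: starlite 1.51's `resolve_handlers` indexes its argument positionally, but on Python 3.12 `logging.config.dictConfig` passes the raw `queue.Queue` through to the handler class, and a `Queue` has no `__len__`. A minimal reproduction of the incompatibility (the `resolve_handlers` body is copied from the trace above):

```python
from queue import Queue

def resolve_handlers(handlers):
    # starlite/logging/utils.py indexes positionally, which requires
    # a sized sequence supporting len() and integer indexing.
    return [handlers[i] for i in range(len(handlers))]

# Python 3.12's logging.config queue-handler support hands the handler
# class a queue.Queue, which defines qsize() but not __len__:
try:
    resolve_handlers(Queue())
except TypeError as exc:
    print(exc)  # object of type 'Queue' has no len()
```

So every `Unable to configure handler 'queue_listener'` error here is the same starlite-1.51-on-3.12 incompatibility surfacing through `dictConfig`, not a regression in the SDK's starlite integration.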
tests.integrations.aiohttp.test_aiohttp::test_traces_sampler_gets_request_object_in_sampling_context[pyloop]
Stack Traces | 0.096s run time
.../integrations/aiohttp/test_aiohttp.py:305: in test_traces_sampler_gets_request_object_in_sampling_context
    traces_sampler=traces_sampler,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
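The aiohttp failures all come from the same guard in the integration's `setup_once`: without working `contextvars`, scope data could leak between concurrent requests, so the integration refuses to enable on interpreters that lack them. A simplified sketch of that check (the real guard in `sentry_sdk/integrations/aiohttp.py` also accounts for the `aiocontextvars` backport):

```python
def has_usable_contextvars():
    # contextvars is stdlib since Python 3.7; on older interpreters
    # the import fails unless the aiocontextvars backport is installed.
    try:
        from contextvars import ContextVar  # noqa: F401
        return True
    except ImportError:
        return False

class DidNotEnable(Exception):
    """Raised when an integration cannot be set up."""

def setup_once():
    if not has_usable_contextvars():
        raise DidNotEnable(
            "The aiohttp integration for Sentry requires Python 3.7+ "
            "or the aiocontextvars package."
        )

setup_once()  # passes silently on any Python 3.7+ interpreter
```

On the old-interpreter CI runs above, the guard trips during `sentry_sdk.Client.__init__`, which is why every aiohttp test fails at `sentry_init` before the test body even starts.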
tests.integrations.starlite.test_starlite::test_middleware_callback_spans
Stack Traces | 0.096s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:172: in test_middleware_callback_spans
    starlite_app = starlite_app_factory(middleware=[SampleMiddleware])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.starlite.test_starlite::test_middleware_partial_receive_send
Stack Traces | 0.096s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:267: in test_middleware_partial_receive_send
    starlite_app = starlite_app_factory(middleware=[SamplePartialReceiveSendMiddleware])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.starlite.test_starlite::test_span_origin
Stack Traces | 0.096s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:325: in test_span_origin
    starlite_app = starlite_app_factory(
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.starlite.test_starlite::test_starlite_scope_user_on_exception_event[send_default_pii=False]
Stack Traces | 0.097s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:372: in test_starlite_scope_user_on_exception_event
    starlite_app = starlite_app_factory(middleware=[TestUserMiddleware])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.starlite.test_starlite::test_catch_exceptions[/controller/error-Exception-Whoa-partial(<function tests.integrations.starlite.test_starlite.starlite_app_factory.<locals>.MyController.controller_error>)]
Stack Traces | 0.098s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:96: in test_catch_exceptions
    starlite_app = starlite_app_factory()
                   ^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.starlite.test_starlite::test_starlite_scope_user_on_exception_event[send_default_pii=True]
Stack Traces | 0.098s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:372: in test_starlite_scope_user_on_exception_event
    starlite_app = starlite_app_factory(middleware=[TestUserMiddleware])
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.aiohttp.test_aiohttp::test_trace_from_headers_if_performance_disabled[pyloop]
Stack Traces | 0.099s run time
.../integrations/aiohttp/test_aiohttp.py:446: in test_trace_from_headers_if_performance_disabled
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.starlite.test_starlite::test_catch_exceptions[/custom_error-Exception-Too Hot-custom_name]
Stack Traces | 0.099s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:96: in test_catch_exceptions
    starlite_app = starlite_app_factory()
                   ^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.starlite.test_starlite::test_middleware_spans
Stack Traces | 0.099s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:125: in test_middleware_spans
    starlite_app = starlite_app_factory(
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.aiohttp.test_aiohttp::test_basic[pyloop]
Stack Traces | 0.115s run time
.../integrations/aiohttp/test_aiohttp.py:27: in test_basic
    sentry_init(integrations=[AioHttpIntegration()])
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/aiohttp.py:102: in setup_once
    " or aiocontextvars package." + CONTEXTVARS_ERROR_MESSAGE
E   sentry_sdk.integrations.DidNotEnable: The aiohttp integration for Sentry requires Python 3.7+  or aiocontextvars package.
E   
E   With asyncio/ASGI applications, the Sentry SDK requires a functional
E   installation of `contextvars` to avoid leaking scope/context data across
E   requests.
E   
E   Please refer to https://docs.sentry..../platforms/python/contextvars/ for more information.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[False-True]
Stack Traces | 0.136s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:692: in test_chat_completion_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed4-6de6e7cc7d21244a52092e07;8c924ef8-63da-43ed-afd0-c6e15bb17438)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
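The huggingface_hub failures are environmental rather than SDK bugs: on `v1.0.0rc0` the client resolves an inference-provider mapping over the live Hugging Face API, so the placeholder model name used by the tests gets a 401 before any instrumented code runs. A hedged sketch of keeping such a test offline, using a stand-in client (the class and method names below are illustrative, not huggingface_hub's actual API):

```python
from unittest import mock

class FakeInferenceClient:
    """Stand-in for a client that resolves a provider mapping over the network."""

    def _fetch_provider_mapping(self, model):
        # Simulates the live-API lookup that 401s for a nonexistent model.
        raise RuntimeError("401 Client Error: Repository Not Found")

    def chat_completion(self, model):
        return self._fetch_provider_mapping(model)

client = FakeInferenceClient()

# Patching the network-touching resolver keeps the test hermetic:
with mock.patch.object(
    FakeInferenceClient, "_fetch_provider_mapping", return_value={"provider": "stub"}
):
    result = client.chat_completion("test-model")

print(result)  # {'provider': 'stub'}
```

In the real suite the equivalent patch target would be the `_fetch_inference_provider_mapping` helper visible in the traces, since that is where the request to `https://huggingface.co` originates.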
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[True-False]
Stack Traces | 0.136s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:692: in test_chat_completion_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed3-65d2927d677c76125e300975;9467378b-3024-4c3a-8124-a51a6f494e2f)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[True-False]
Stack Traces | 0.137s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:770: in test_chat_completion_streaming_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed5-29a13c3d1a5278d12b913fbe;59ca1da6-0ef3-48da-9b3f-74a928d318aa)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[True-True]
Stack Traces | 0.137s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:692: in test_chat_completion_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed3-0ea0478b7047623051a00bee;96ed917a-1ea2-4621-8150-04647977cff8)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
The following 14 failures hit the identical RepositoryNotFoundError trace shown above (401 Client Error fetching https://huggingface..../api/models/test-model?expand=inferenceProviderMapping); only the test entry point, tox environment, and Request ID differ:

tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[False-False]
Stack Traces | 0.138s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[False-True]
Stack Traces | 0.138s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[True-False]
Stack Traces | 0.138s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[False-False]
Stack Traces | 0.138s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[True-True]
Stack Traces | 0.139s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[False-True]
Stack Traces | 0.139s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[True-False]
Stack Traces | 0.139s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[False-False]
Stack Traces | 0.139s run time | py3.12-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[True-False]
Stack Traces | 0.145s run time | py3.13-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[True-False]
Stack Traces | 0.145s run time | py3.13-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[True-True]
Stack Traces | 0.145s run time | py3.13-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[True-True]
Stack Traces | 0.146s run time | py3.13-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[False-False]
Stack Traces | 0.146s run time | py3.13-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[False-True]
Stack Traces | 0.146s run time | py3.13-huggingface_hub-v1.0.0rc0 | same trace as above
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[False-False]
Stack Traces | 0.146s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb8-3c7290c86a2fea201553cc00;ba6595d0-a697-45f3-bed8-c22593e5c0f3)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[False-False]
Stack Traces | 0.147s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:692: in test_chat_completion_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ebe-7261b9b065a169b5540d620c;78e8580c-05d7-44dc-b867-c31160e6968e)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[False-False]
Stack Traces | 0.148s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:512: in test_chat_completion
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eba-73f95ee548ea033756b1fdd9;f3f81fe9-75a9-41ed-b3b8-53b885c310b1)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion[False-True]
Stack Traces | 0.15s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:512: in test_chat_completion
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eba-176e5ce1313fb96b5f8ca320;8d270645-9c19-49c3-9f57-9ccbaeb87515)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_api_error
Stack Traces | 0.15s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:637: in test_chat_completion_api_error
    (span,) = transaction["spans"]
    ^^^^^^^
E   ValueError: too many values to unpack (expected 1)
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[False-True]
Stack Traces | 0.153s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:692: in test_chat_completion_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ebd-312e76db6b8aaa3d2ed8c59b;5adc22f6-a96c-478a-af35-892ff2d8746d)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[False-False]
Stack Traces | 0.155s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed0-407cc8c40f9ecdd75d96043a;d10fdc86-d691-4552-9d92-3692a30ac3b7)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[True-True]
Stack Traces | 0.159s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:770: in test_chat_completion_streaming_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed5-56dac29977726bae4dd90f20;821abeb8-3629-4041-aaa3-9e323bed11ea)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[True-False]
Stack Traces | 0.159s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb8-704c506d62f63ff6275b3dde;35568161-112b-4f38-b455-e4765e5f09b2)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_api_error
Stack Traces | 0.161s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:637: in test_chat_completion_api_error
    (span,) = transaction["spans"]
    ^^^^^^^
E   ValueError: too many values to unpack (expected 1)
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[True-True]
Stack Traces | 0.162s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:770: in test_chat_completion_streaming_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ebe-5844cce46566dbbf3584af6d;63d0a4ed-22b7-4ec5-860d-c501b30dcdc4)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[True-True]
Stack Traces | 0.174s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:573: in test_chat_completion_streaming
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ebb-41ea42a107ce04a86642749b;f031e0ec-b308-4ec1-bffc-5d2163c7baff)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[False-True]
Stack Traces | 0.174s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb8-1e229dfd4bc05c390ee9cb65;863d866d-47d8-4e29-9016-24dfe19a76e9)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[False-True]
Stack Traces | 0.178s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:573: in test_chat_completion_streaming
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ebb-3bee4d6408c235cb4af3383b;2b529f1c-32bd-4efa-8109-5a5d8a112fe8)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[True-True]
Stack Traces | 0.179s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:573: in test_chat_completion_streaming
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed1-6f9013733af522494016b60e;e4f310ff-7122-436e-ace1-ea9887e84adf)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_with_tools[True-False]
Stack Traces | 0.18s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:692: in test_chat_completion_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ebd-0d06c5ec019cb8987d12655e;2d60aed8-0723-4bf3-9f43-1ac464929920)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[False-True]
Stack Traces | 0.182s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:770: in test_chat_completion_streaming_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed5-2941f5344de28296328543f5;42f17b18-7c49-4810-b879-a94c37c8df85)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[False-False]
Stack Traces | 0.189s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:770: in test_chat_completion_streaming_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ec0-414e31d8104a57cd56e1893f;090a4701-24a9-4b99-a0ce-561e07ea4323)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[True-False]
Stack Traces | 0.2s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ece-26cab3c94af6e1f96005fab9;30e9bc43-a6d8-4d65-a93a-f4f470315656)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[False-True]
Stack Traces | 0.203s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ecc-448961e95a5873844f2b39ff;7a98c1a0-3714-4048-a33d-f98ee0091b70)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[True-False]
Stack Traces | 0.205s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ecb-10b474d77111c0fd3688bea7;d0b66991-a448-46e7-9f91-4fad0cec58eb)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[True-True]
Stack Traces | 0.209s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ecd-67cd5c444c1448214f4ebb54;314bcb68-9fd7-463b-b2ac-6348ba739b2c)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.anthropic.test_anthropic::test_nonstreaming_create_message[True-True]
Stack Traces | 0.21s run time
.../integrations/anthropic/test_anthropic.py:85: in test_nonstreaming_create_message
    send_default_pii=send_default_pii,
tests/conftest.py:209: in inner
    client = sentry_sdk.Client(*a, **kw)
sentry_sdk/client.py:284: in __init__
    self._init_impl()
sentry_sdk/client.py:420: in _init_impl
    disabled_integrations=self.options["disabled_integrations"],
sentry_sdk/integrations/__init__.py:219: in setup_integrations
    type(integration).setup_once()
sentry_sdk/integrations/huggingface_hub.py:47: in setup_once
    huggingface_hub.inference._client.InferenceClient.chat_completion,
E   AttributeError: type object 'InferenceClient' has no attribute 'chat_completion'
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming[False-False]
Stack Traces | 0.212s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:573: in test_chat_completion_streaming
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ed3-6df19cc4357e8756036a9fca;11ff319c-f625-4dfe-9013-4c078f5411a3)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[False-True]
Stack Traces | 0.213s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb5-70c75ecd6a8e2d7e3e906e78;8fb35741-704c-4d16-a1c0-2cab19388d47)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[False-False]
Stack Traces | 0.217s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb6-0c784cae2aa66f12474fc637;c861484f-f999-4b09-b582-41b1c8c09ea8)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[True-False]
Stack Traces | 0.217s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb5-5506e7bc1c0c2f1871fd3b29;4e838481-4247-46a0-9d42-8252eed51421)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_chat_completion_streaming_with_tools[True-False]
Stack Traces | 0.225s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:770: in test_chat_completion_streaming_with_tools
    client.chat_completion(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:897: in chat_completion
    provider_helper = get_provider_helper(
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ebf-7549297417195cf11030da7d;aebaed22-5d61-4b7c-826a-72be1e76b3b5)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[True-True]
Stack Traces | 0.236s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb7-641dd2fa3c848a0d7733ea39;719e37bf-9abd-4e70-a99c-459de7e86f13)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.starlite.test_starlite::test_catch_exceptions[/some_url-ZeroDivisionError-division by zero-tests.integrations.starlite.test_starlite.starlite_app_factory.<locals>.homepage_handler]
Stack Traces | 0.254s run time
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:608: in configure
    handler = self.configure_handler(handlers[name])
              ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:876: in configure_handler
    result = factory(**kwargs)
             ^^^^^^^^^^^^^^^^^
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:771: in _configure_queue_handler
    handler = klass(q, **kwargs)
              ^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/standard.py:21: in __init__
    handlers = resolve_handlers(handlers)
               ^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/logging/utils.py:18: in resolve_handlers
    return [handlers[i] for i in range(len(handlers))]
                                       ^^^^^^^^^^^^^
E   TypeError: object of type 'Queue' has no len()

The above exception was the direct cause of the following exception:
.../integrations/starlite/test_starlite.py:96: in test_catch_exceptions
    starlite_app = starlite_app_factory()
                   ^^^^^^^^^^^^^^^^^^^^^^
.../integrations/starlite/test_starlite.py:46: in starlite_app_factory
    app = Starlite(
sentry_sdk/utils.py:1816: in runner
    return sentry_patched_function(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
sentry_sdk/integrations/starlite.py:100: in injection_wrapper
    old__init__(self, *args, **kwargs)
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../site-packages/starlite/app.py:385: in __init__
    self.get_logger = self.logging_config.configure()
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-starlite-v1.51.16/lib/python3.12.../starlite/config/logging.py:200: in configure
    config.dictConfig(values)
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:942: in dictConfig
    dictConfigClass(config).configure()
.../hostedtoolcache/Python/3.12.11.............../x64/lib/python3.12/logging/config.py:615: in configure
    raise ValueError('Unable to configure handler '
E   ValueError: Unable to configure handler 'queue_listener'
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[False-False]
Stack Traces | 0.283s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ecd-13f045f37d9dc0a605a63df4;2e687653-4540-4c98-8628-f3fe53781dc3)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[True-True]
Stack Traces | 0.398s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ecb-28db0bfb205e6474108ecdc5;a44fb5ec-3a30-43ab-a22d-a8b1aad74e5b)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation[True-True]
Stack Traces | 0.407s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:397: in test_text_generation
    client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.13-huggingface_hub-v1.0.0rc0/lib/python3.13.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82eb4-25a720c205e07b3856fbce3e;0370ec71-5ad2-4579-a82e-61cdd32d1cba)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
tests.integrations.huggingface_hub.test_huggingface_hub::test_text_generation_streaming[False-True]
Stack Traces | 1.11s run time
.../integrations/huggingface_hub/test_huggingface_hub.py:454: in test_text_generation_streaming
    for _ in client.text_generation(
sentry_sdk/integrations/huggingface_hub.py:134: in new_huggingface_task
    raise e from None
sentry_sdk/integrations/huggingface_hub.py:128: in new_huggingface_task
    res = f(*args, **kwargs)
          ^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/inference/_client.py:2386: in text_generation
    provider_helper = get_provider_helper(self.provider, task="text-generation", model=model_id)
                      ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/__init__.py:195: in get_provider_helper
    provider_mapping = _fetch_inference_provider_mapping(model)
                       ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../inference/_providers/_common.py:304: in _fetch_inference_provider_mapping
    info = HfApi().model_info(model, expand=["inferenceProviderMapping"])
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_validators.py:89: in _inner_fn
    return fn(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../site-packages/huggingface_hub/hf_api.py:2544: in model_info
    hf_raise_for_status(r)
.tox/py3.12-huggingface_hub-v1.0.0rc0/lib/python3.12.../huggingface_hub/utils/_http.py:613: in hf_raise_for_status
    raise _format(RepositoryNotFoundError, message, response) from e
E   huggingface_hub.errors.RepositoryNotFoundError: 401 Client Error. (Request ID: Root=1-68c82ece-0a1077026ea7a84c0d7b96de;6a2b3731-3236-4978-b29c-f9b8faab9710)
E   
E   Repository Not Found for url: https://huggingface..../api/models/test-model?expand=inferenceProviderMapping.
E   Please make sure you specified the correct `repo_id` and `repo_type`.
E   If you are trying to access a private or gated repo, make sure you are authenticated. For more details, see https://huggingface..../docs/huggingface_hub/authentication
E   Invalid username or password.
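All of the huggingface_hub failures above share one shape: the code under test ends up in `HfApi().model_info(...)`, which makes a real HTTP request to huggingface.co and gets a 401 for the nonexistent `test-model` repo. A minimal sketch of keeping such tests off the network by injecting a fake API object, using only `unittest.mock` (the `FakeModelInfo` class and `fetch_mapping` helper are hypothetical stand-ins, not names from the SDK or huggingface_hub):

```python
# Hypothetical sketch: stub out the model_info network call so the
# test never reaches huggingface.co and never sees a 401.
from unittest import mock

class FakeModelInfo:
    # minimal stand-in for the object model_info would return
    def __init__(self, model_id):
        self.id = model_id
        self.inference_provider_mapping = {}

def fetch_mapping(model_id, api):
    # stands in for library code that resolves a provider mapping
    # by calling api.model_info(...)
    return api.model_info(model_id).inference_provider_mapping

fake_api = mock.Mock()
# side_effect as a callable: each call returns FakeModelInfo(model_id)
fake_api.model_info.side_effect = FakeModelInfo

mapping = fetch_mapping("test-model", fake_api)
```

In the actual test suite the same effect would typically come from `monkeypatch`-ing or `mock.patch`-ing the call site, so the assertions exercise the SDK wrapper rather than Hugging Face's live API.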
